Dimensionality Reduction on Grassmannian via Riemannian Optimization: A Generalized Perspective
Abstract
This paper proposes a generalized framework with joint normalization that learns lower-dimensional subspaces with maximum discriminative power by exploiting Riemannian geometry. In particular, we model the similarity/dissimilarity between subspaces using various metrics defined on the Grassmannian and formulate dimensionality reduction as a nonlinear constrained optimization problem that accounts for orthogonality. To obtain the linear mapping, we derive the components required to perform Riemannian optimization (e.g., Riemannian conjugate gradient) on the original Grassmannian through an orthonormal projection. We respect the Riemannian geometry of the Grassmann manifold and search for this projection directly from one Grassmann manifold to another, face to face, without any additional transformations. In this natural, geometry-aware way, any metric on the Grassmann manifold can in principle be accommodated by our model. We combine five metrics with our model, and the learning process can be treated as an unconstrained optimization problem on a Grassmann manifold. Experiments on several datasets demonstrate that our approach yields a significant accuracy gain over state-of-the-art methods.
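To make the setup concrete, a problem of this shape can be prototyped with an off-the-shelf Riemannian optimizer. The following is a minimal sketch, assuming the open-source pymanopt library (2.x API) with autograd; the toy two-class data, the projection-metric cost, and all names (`random_subspace`, `projector`, the dimensions `n`, `m`, `p`) are illustrative assumptions, not the paper's actual formulation or code.

```python
import autograd.numpy as anp
import numpy as np
import pymanopt
from pymanopt.manifolds import Grassmann
from pymanopt.optimizers import ConjugateGradient

# Toy data (assumed, for illustration only): orthonormal bases of
# p-dimensional subspaces of R^n, drawn around two class "centers".
rng = np.random.default_rng(0)
n, m, p = 20, 6, 3  # ambient dim, target dim, subspace dim

def random_subspace(center, noise=0.3):
    return np.linalg.qr(center + noise * rng.standard_normal((n, p)))[0]

c0, c1 = rng.standard_normal((n, p)), rng.standard_normal((n, p))
Xs = [random_subspace(c0) for _ in range(5)] + [random_subspace(c1) for _ in range(5)]
labels = [0] * 5 + [1] * 5

def projector(W, X):
    # Orthogonal projector onto span(W^T X); W^T X need not be orthonormal.
    Y = W.T @ X
    return Y @ anp.linalg.solve(Y.T @ Y, Y.T)

manifold = Grassmann(n, m)

@pymanopt.function.autograd(manifold)
def cost(W):
    # Pull same-class subspaces together, push different-class ones apart,
    # measured by the squared projection metric 0.5 * ||P_i - P_j||_F^2.
    total = 0.0
    for i in range(len(Xs)):
        for j in range(i + 1, len(Xs)):
            d2 = 0.5 * anp.sum((projector(W, Xs[i]) - projector(W, Xs[j])) ** 2)
            total = total + (d2 if labels[i] == labels[j] else -d2)
    return total

problem = pymanopt.Problem(manifold, cost)
W_opt = ConjugateGradient(verbosity=0).run(problem).point  # n x m orthonormal map
```

Because the optimizer works directly on the Grassmann manifold, orthonormality of the learned mapping is maintained by the geometry itself rather than by an explicit penalty, which mirrors the unconstrained-on-the-manifold view described above.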
Similar resources
Online Supervised Subspace Tracking
We present a framework for supervised subspace tracking with two time series x_t and y_t, where x_t is the high-dimensional predictor and y_t the response variable, so the subspace tracking must take both sequences into consideration. It extends the classic online subspace tracking work, which can be viewed as tracking x_t only. Our online sufficient dimensionality r...
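For context, the "classic" unsupervised tracking of x_t alone that this work extends can be summarized by an Oja-style stochastic update. The sketch below is that classical baseline, not the supervised method of the paper; the function name and step size are assumptions.

```python
import numpy as np

def oja_update(U, x, lr=0.01):
    """One Oja-style stochastic step for tracking the principal subspace
    of a stream x_t alone (the classical, unsupervised baseline).
    U: current n x k orthonormal basis; x: new observation in R^n."""
    U = U + lr * np.outer(x, x @ U)  # push U toward the sample covariance
    Q, _ = np.linalg.qr(U)           # re-orthonormalize with a thin QR
    return Q
```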
Generalized BackPropagation, Étude De Cas: Orthogonality
This paper introduces an extension of the backpropagation algorithm that enables layers with constrained weights in a deep network. In particular, we make use of Riemannian geometry and optimization techniques on matrix manifolds to step outside normal practice in training deep networks, equipping the network with structures such as orthogonality or positive definiteness. Base...
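As an illustration of what an orthogonality-constrained weight update can look like, here is a minimal numpy sketch of one Riemannian gradient step on the Stiefel manifold (W^T W = I). It is a generic textbook step, not the paper's generalized backpropagation; `stiefel_step` and the learning rate are assumptions.

```python
import numpy as np

def stiefel_step(W, G, lr=1e-2):
    """One Riemannian gradient step that keeps a weight matrix W orthonormal.
    W: n x p with W^T W = I; G: Euclidean gradient of the loss w.r.t. W."""
    # Project G onto the tangent space of the Stiefel manifold at W.
    sym = (W.T @ G + G.T @ W) / 2
    riem_grad = G - W @ sym
    # Retract the update back onto the manifold with a thin QR factorization.
    Q, R = np.linalg.qr(W - lr * riem_grad)
    return Q * np.sign(np.diag(R))  # sign fix makes the retraction well defined
```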
Averaging complex subspaces via a Karcher mean approach
We propose a conjugate gradient type optimization technique for the computation of the Karcher mean on the set of complex linear subspaces of fixed dimension, modeled by the so-called Grassmannian. The identification of the Grassmannian with Hermitian projection matrices allows an accessible introduction of the geometric concepts required for an intrinsic conjugate gradient method. In particula...
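A common way to compute such a mean is the fixed-point iteration that averages logarithmic maps and follows the exponential map. The sketch below uses pymanopt's real Grassmann manifold for the geometric primitives; it is a plain fixed-point variant for illustration, not the paper's conjugate gradient scheme for complex subspaces, and all names are assumptions.

```python
import numpy as np
from pymanopt.manifolds import Grassmann

def karcher_mean(points, manifold, max_iters=100, tol=1e-8):
    """Fixed-point iteration for the Karcher (Frechet) mean on a manifold:
    repeatedly move the estimate along the average of the log maps."""
    mu = points[0]
    for _ in range(max_iters):
        xi = sum(manifold.log(mu, X) for X in points) / len(points)
        if manifold.norm(mu, xi) < tol:  # mean tangent vector ~ 0 => converged
            break
        mu = manifold.exp(mu, xi)
    return mu

# Example: mean of a few random 3-dimensional subspaces of R^10.
manifold = Grassmann(10, 3)
pts = [manifold.random_point() for _ in range(5)]
mean = karcher_mean(pts, manifold)
```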
Nuclear Norm Regularized Least Squares Optimization on Grassmannian Manifolds
This paper aims to address a class of nuclear norm regularized least squares (NNLS) problems. By exploiting the underlying low-rank matrix manifold structure, the problem with nuclear norm regularization is cast as a Riemannian optimization problem over matrix manifolds. Compared with existing NNLS algorithms involving singular value decomposition (SVD) of large-scale matrices, our method achieve...
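One well-known SVD-free route in the same spirit is the factored (Burer-Monteiro style) surrogate, which uses the identity ||X||_* = min over X = G H^T of 0.5 (||G||_F^2 + ||H||_F^2). The sketch below exploits it with plain gradient descent; this is a related technique shown for illustration, not the paper's Riemannian algorithm, and all names (`nnls_factored`, the linear operator `A_op`/`A_adj`) are assumptions.

```python
import numpy as np

def nnls_factored(A_op, A_adj, b, shape, rank, lam=0.1, lr=1e-3, iters=1000):
    """Gradient descent on the factored surrogate of
        min_X 0.5 * ||A(X) - b||^2 + lam * ||X||_*,
    using ||X||_* = min over X = G H^T of 0.5 * (||G||_F^2 + ||H||_F^2),
    so no SVD of a large matrix is ever formed."""
    m, n = shape
    rng = np.random.default_rng(0)
    G = 0.1 * rng.standard_normal((m, rank))
    H = 0.1 * rng.standard_normal((n, rank))
    for _ in range(iters):
        R = A_adj(A_op(G @ H.T) - b)  # gradient of the data-fit term at X = G H^T
        G, H = G - lr * (R @ H + lam * G), H - lr * (R.T @ G + lam * H)
    return G @ H.T

# Example: low-rank denoising, where A is the identity operator.
noisy = np.random.default_rng(1).standard_normal((30, 20))
X_hat = nnls_factored(lambda X: X, lambda R: R, noisy, shape=(30, 20), rank=5)
```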
Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 2 Applications and Future Perspectives
Part 2 of this monograph builds on the introduction to tensor networks and their operations presented in Part 1. It focuses on tensor network models for super-compressed higher-order representation of data/parameters and related cost functions, while providing an outline of their applications in machine learning and data analytics. A particular emphasis is on the tensor train (TT) and Hierarchi...
Journal: CoRR
Volume: abs/1711.06382
Pages: -
Publication year: 2017